# Featured Recommended AI Models
- **Pegasus Indonesian Base Finetune** (thonyyy, Apache-2.0) · Text Generation · Transformers, Other · 172 downloads · 2 likes
  An Indonesian text summarization model based on the PEGASUS architecture, fine-tuned on the IndoSum, Liputan6, and XLSum datasets; well suited to news summarization (usage sketch below).
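A minimal usage sketch for the summarization model above, using the transformers pipeline API. The repository id is an assumption pieced together from the listed name and author and may not match the actual hub id.

```python
# Sketch: Indonesian news summarization via the transformers pipeline API.
# The model id below is an assumption derived from the listed name and author.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="thonyyy/pegasus_indonesian_base-finetune",  # assumed repo id
)

article = "Teks berita berbahasa Indonesia yang ingin diringkas ..."
result = summarizer(article, max_length=128, min_length=32, do_sample=False)
print(result[0]["summary_text"])
```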
- **Radbert** (StanfordAIMI, MIT) · Large Language Model · Transformers, English · 1,749 downloads · 23 likes
  A radiology-specific BERT model initialized from BioBERT and further pre-trained on radiology report data; suited to natural language processing tasks in the biomedical and radiology domains.
- **Scibert Scivocab Uncased Finetuned Ner** (HenryHXR) · Sequence Labeling · Transformers · 19 downloads · 0 likes
  A named entity recognition model fine-tuned from SciBERT, a model pre-trained on scientific text (usage sketch below).
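A minimal sketch of running an NER checkpoint like this one through the token-classification pipeline. The repository id is an assumption based on the listed name and author.

```python
# Sketch: named entity recognition with a fine-tuned SciBERT checkpoint.
# The model id below is an assumption derived from the listed name and author.
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="HenryHXR/scibert_scivocab_uncased-finetuned-ner",  # assumed repo id
    aggregation_strategy="simple",  # merge word pieces into whole entity spans
)

for entity in ner("We trained a convolutional neural network on the MNIST dataset."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```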
- **Roberta Large Ner English** (ydshieh) · Sequence Labeling · Transformers, English · 36 downloads · 2 likes
  An English named entity recognition model fine-tuned from roberta-large on the conll2003 dataset, optimized in particular for recognizing entities that do not start with a capital letter.
- **Distilbert Base Uncased Agnews Student** (joeddav, MIT) · Text Classification · Transformers, English · 900 downloads · 5 likes
  A student model distilled from a zero-shot classification pipeline on the AG News task; it mainly demonstrates how an expensive zero-shot model can be distilled into a much cheaper dedicated classifier (usage sketch below).
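Because the distillation produces an ordinary text classifier, it can be called through the plain text-classification pipeline rather than the slower zero-shot pipeline it was distilled from. The repository id is an assumption based on the listed name and author.

```python
# Sketch: using the distilled student as a regular AG News topic classifier.
# The model id below is an assumption derived from the listed name and author.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="joeddav/distilbert-base-uncased-agnews-student",  # assumed repo id
)

print(classifier("NASA announces a new mission to study the outer planets."))
# Expected: a single AG News topic label (World, Sports, Business, or Sci/Tech)
```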
- **Bert Base Indonesian 522M** (cahya, MIT) · Large Language Model · Other · 2,799 downloads · 25 likes
  A BERT base model pretrained on Indonesian Wikipedia with a masked language modeling (MLM) objective; uncased, i.e. case-insensitive (usage sketch below).
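Since the model was pretrained with an MLM objective, the fill-mask pipeline is the most direct way to exercise it. The repository id is an assumption based on the listed name and author.

```python
# Sketch: masked-token prediction with the Indonesian BERT base model.
# The model id below is an assumption derived from the listed name and author.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="cahya/bert-base-indonesian-522M")  # assumed repo id

# Predict the masked word in an Indonesian sentence.
for prediction in fill_mask("Ibu kota Indonesia adalah [MASK]."):
    print(prediction["token_str"], round(prediction["score"], 3))
```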
- **Distilbert Base Uncased Finetuned Conll03 English** (elastic, Apache-2.0) · Sequence Labeling · Transformers, English · 2,363 downloads · 33 likes
  A lightweight named entity recognition model based on DistilBERT, optimized for English text; uncased (case-insensitive).
- **Indobert Large P2** (indobenchmark, MIT) · Large Language Model · Other · 2,272 downloads · 8 likes
  A state-of-the-art Indonesian language model based on the BERT architecture, trained with masked language modeling (MLM) and next sentence prediction (NSP) objectives (loading sketch below).
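A minimal sketch of loading a checkpoint like this as a generic encoder to obtain contextual embeddings for downstream Indonesian tasks. The repository id is an assumption based on the listed name and author.

```python
# Sketch: using IndoBERT as an encoder for contextual token embeddings.
# The model id below is an assumption derived from the listed name and author.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "indobenchmark/indobert-large-p2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Selamat pagi, apa kabar?", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per input token: (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```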
- **English Pert Large** (hfl) · Large Language Model · Transformers, English · 23 downloads · 3 likes
  PERT is a pre-trained language model built on the BERT architecture; this release targets English tasks.
- **Bert Base Indonesian 1.5G** (cahya, MIT) · Large Language Model · Other · 40.08k downloads · 5 likes
  A BERT-based Indonesian pretrained model trained on Wikipedia and newspaper data; suitable for a range of natural language processing tasks.
- **Indobert Lite Base P1** (indobenchmark, MIT) · Large Language Model · Transformers, Other · 723 downloads · 0 likes
  A BERT variant tailored to Indonesian, trained with masked language modeling and next sentence prediction objectives; the Lite version is a lightweight model suited to resource-constrained environments.
- **Danish Bert Botxo Ner Dane** (Maltehb) · Sequence Labeling · Other · 594 downloads · 4 likes
  A Danish pre-trained BERT model developed by Certainly (formerly BotXO) and subsequently fine-tuned by Malte Højmark-Bertelsen on the DaNE dataset for named entity recognition.